A Generalized Univariate Newton Method Motivated by Proximal Regularization
Authors

Abstract
We devise a new generalized univariate Newton method for solving nonlinear equations, motivated by Bregman distances and proximal regularization of optimization problems. We prove quadratic convergence of the new method, a special instance of which is the classical Newton’s method. We illustrate the possible benefits of the new method over classical Newton’s method by means of test problems involving the Lambert W function, Kullback-Leibler distance and a polynomial. These test problems provide insight as to which instance of the generalized method could be chosen for a given nonlinear equation. Finally, we derive a closed-form expression for the asymptotic error constant of the generalized method and make further comparisons involving this constant.
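The abstract does not reproduce the generalized iteration itself, but it notes that classical Newton's method is a special instance of the new scheme. A minimal sketch of that classical special case, applied to one of the paper's stated test problems (evaluating the Lambert W function at 1 by solving w e^w = 1), might look as follows; the function names and tolerances here are illustrative, not taken from the paper:

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Classical Newton iteration: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:  # stop once the update is negligible
            break
    return x

# Lambert W at 1 (the omega constant): solve w * exp(w) - 1 = 0.
w = newton(lambda x: x * math.exp(x) - 1.0,
           lambda x: (x + 1.0) * math.exp(x),
           x0=1.0)
# w ≈ 0.5671, and quadratic convergence gives roughly doubling
# numbers of correct digits per iteration near the root.
```

The generalized method of the paper would replace the plain Newton step with one induced by a Bregman-distance/proximal term, which this sketch does not attempt to reproduce.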
Similar Resources
A preconditioning proximal Newton method for nondifferentiable convex optimization
We propose a proximal Newton method for solving nondifferentiable convex optimization. This method combines the generalized Newton method with Rockafellar's proximal point algorithm. At each step, the proximal point is found approximately and the regularization matrix is preconditioned to overcome inexactness of this approximation. We show that such a preconditioning is possible within some ac...
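The combination described above is not spelled out in this excerpt, but the underlying proximal point iteration of Rockafellar, x_{k+1} = argmin_x f(x) + (1/(2λ))‖x − x_k‖², can be sketched in one dimension for the nondifferentiable convex function f(x) = |x|, whose proximal operator is soft-thresholding. All names below are illustrative:

```python
def prox_abs(v, lam):
    """Proximal operator of f(x) = |x|: soft-thresholding by lam."""
    if v > lam:
        return v - lam
    if v < -lam:
        return v + lam
    return 0.0

def proximal_point(prox, x0, lam=0.5, n_iter=20):
    """Rockafellar's proximal point iteration x_{k+1} = prox_{lam f}(x_k)."""
    x = x0
    for _ in range(n_iter):
        x = prox(x, lam)
    return x

# |x| is minimized at 0; the iterates shrink 3.0 -> 2.5 -> ... -> 0.0.
x_star = proximal_point(prox_abs, x0=3.0)
```

For |x| the proximal subproblem has this closed form; the method cited above instead solves each subproblem approximately with a generalized Newton step and preconditions the regularization matrix to absorb that inexactness.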
A Proximal Point Algorithm for Log-Determinant Optimization with Group Lasso Regularization
We consider the covariance selection problem where variables are clustered into groups and the inverse covariance matrix is expected to have a blockwise sparse structure. This problem is realized via penalizing the maximum likelihood estimation of the inverse covariance matrix by group Lasso regularization. We propose to solve the resulting log-determinant optimization problem with the classica...
Proximal Quasi-Newton Methods for Nondifferentiable Convex Optimization
Some global convergence properties of a variable metric algorithm for minimization without exact line searches, in R. [23] A superlinearly convergent algorithm for minimizing the Moreau-Yosida regularization F. However, this algorithm makes use of the generalized Jacobian of F, instead of matrices B_k generated by a quasi-Newton formula. Moreover, the line search is performed on the function F, rath...
Generalized Newton Methods for the 2D-Signorini Contact Problem with Friction in Function Space
The 2D-Signorini contact problem with Tresca and Coulomb friction is discussed in infinite-dimensional Hilbert spaces. First, the problem with given friction (Tresca friction) is considered. It leads to a constrained non-differentiable minimization problem. By means of the Fenchel duality theorem this problem can be transformed into a constrained minimization involving a smooth functional. A regu...
A Newton Root-Finding Algorithm For Estimating the Regularization Parameter For Solving Ill-Conditioned Least Squares Problems
We discuss the solution of numerically ill-posed overdetermined systems of equations using Tikhonov a-priori-based regularization. When the noise distribution on the measured data is available to appropriately weight the fidelity term, and the regularization is assumed to be weighted by inverse covariance information on the model parameters, the underlying cost functional becomes a random varia...
Journal: J. Optimization Theory and Applications
Volume: 155, Issue: -
Pages: -
Publication date: 2012